Some Convergence Results on the Regularized Alternating Least-Squares Method for Tensor Decomposition

Authors

  • Na Li
  • Stefan Kindermann
  • Carmeliza Navasca
Abstract

We study the convergence of the Regularized Alternating Least-Squares (RALS) algorithm for tensor decompositions. As a main result, we show that, given the existence of critical points of the Alternating Least-Squares method, the limit points of convergent subsequences of the RALS iterates are critical points of the least squares cost functional. Numerical examples indicate a faster convergence rate for RALS in comparison to the standard alternating least squares method.
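For concreteness, the sketch below illustrates one common proximal-regularized ALS variant for a rank-R CP model of a third-order tensor, in which each factor update adds a term of the form lam * ||X - X_prev||_F^2 to its least-squares subproblem. The fixed regularization parameter, random initialization, and fixed iteration count are illustrative assumptions, not the exact scheme analyzed in the paper.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product: column r is kron(U[:, r], V[:, r])."""
    I, R = U.shape
    J, _ = V.shape
    return (U[:, None, :] * V[None, :, :]).reshape(I * J, R)

def rals_cp(T, R, lam=0.1, iters=200, seed=0):
    """Minimal proximal-regularized ALS sketch for a rank-R CP model of a
    3rd-order tensor T (illustrative, not the paper's exact scheme)."""
    I, J, K = T.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
    # mode unfoldings whose column ordering matches the khatri_rao row ordering
    T1 = T.reshape(I, J * K)
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)
    reg = lam * np.eye(R)
    for _ in range(iters):
        # each update solves (M^T M + lam I) X^T = (T_(n) M + lam X_prev)^T
        M = khatri_rao(B, C)
        A = np.linalg.solve(M.T @ M + reg, (T1 @ M + lam * A).T).T
        M = khatri_rao(A, C)
        B = np.linalg.solve(M.T @ M + reg, (T2 @ M + lam * B).T).T
        M = khatri_rao(A, B)
        C = np.linalg.solve(M.T @ M + reg, (T3 @ M + lam * C).T).T
    return A, B, C
```

The fit can be checked by reconstructing the model tensor, e.g. np.einsum('ir,jr,kr->ijk', A, B, C), and comparing its Frobenius-norm distance to T across iterations.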

Similar Articles

On Accelerating the Regularized Alternating Least Square Algorithm for Tensors (arXiv:1507.04721v2 [math.NA], 21 Jul 2017)

In this paper, we discuss the acceleration of the regularized alternating least squares (RALS) algorithm for tensor approximation. We propose a fast iterative method using Aitken-Steffensen-like updates for the regularized algorithm. Through numerical experiments, the accelerated algorithm demonstrates a faster convergence rate in comparison to both the standard and regulariz...
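The snippet below sketches a generic Aitken delta-squared / Steffensen-style acceleration of a fixed-point iteration, the kind of update this abstract alludes to. The name `fixed_point_map`, the componentwise formula, and the fixed number of rounds are illustrative assumptions, not the paper's exact accelerated RALS scheme.

```python
import numpy as np

def aitken_step(x0, x1, x2, eps=1e-12):
    """Componentwise Aitken delta-squared extrapolation of three successive iterates."""
    d1 = x1 - x0
    d2 = x2 - 2.0 * x1 + x0
    out = x2.copy()
    safe = np.abs(d2) > eps          # avoid division by (near-)zero second differences
    out[safe] = x0[safe] - d1[safe] ** 2 / d2[safe]
    return out

def steffensen_accelerate(fixed_point_map, x, n_rounds=20):
    """Steffensen-style driver: two map applications, then one Aitken update per round.

    `fixed_point_map` stands in for one full sweep of the underlying iteration
    (e.g. one regularized ALS sweep with the factor matrices stacked into a
    single vector); that stacking is an assumption made for illustration.
    """
    for _ in range(n_rounds):
        x1 = fixed_point_map(x)
        x2 = fixed_point_map(x1)
        x = aitken_step(x, x1, x2)
    return x
```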


Iterative Methods for Symmetric Outer Product Tensor Decomposition

We study the symmetric outer product decomposition for tensors. Specifically, we look at the decomposition of a fully (partially) symmetric tensor into a sum of rank-one fully (partially) symmetric tensors. We present an iterative technique for the third-order partially symmetric tensor and the fourth-order fully and partially symmetric tensors. We include several numerical examples which indicate a faster convergence ...


On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors

Tensor decomposition has important applications in various disciplines, but it remains an extremely challenging task even to this date. A slightly more manageable endeavor has been to find a low rank approximation in place of the decomposition. Even for this less stringent undertaking, it is an established fact that tensors beyond matrices can fail to have best low rank approximations, with the...


SPALS: Fast Alternating Least Squares via Implicit Leverage Scores Sampling

Tensor CANDECOMP/PARAFAC (CP) decomposition is a powerful but computationally challenging tool in modern data analytics. In this paper, we show ways of sampling intermediate steps of alternating minimization algorithms for computing low rank tensor CP decompositions, leading to the sparse alternating least squares (SPALS) method. Specifically, we sample the Khatri-Rao product, which arises as a...
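As a rough illustration of the sampling idea, the sketch below draws rows of the Khatri-Rao product with probabilities proportional to the product of the two factors' leverage scores and solves the reweighted, subsampled least-squares problem for one factor. The function names, the importance-sampling reweighting, and the sampling proposal are assumptions made for illustration, not the SPALS algorithm as published; `T1` is assumed to be the mode-1 unfolding whose column index j*K + k matches the Khatri-Rao row ordering.

```python
import numpy as np

def leverage_scores(U):
    """Row leverage scores of U via a thin QR factorization."""
    Q, _ = np.linalg.qr(U)
    return np.sum(Q ** 2, axis=1)

def sampled_als_update(T1, B, C, n_samples, rng):
    """Sketch of one row-sampled least-squares update for factor A.

    Rows of M = khatri_rao(B, C) are drawn with probability proportional to
    the product of the factors' leverage scores, then a reweighted small
    least-squares problem is solved instead of the full one.
    """
    J, R = B.shape
    K, _ = C.shape
    lb, lc = leverage_scores(B), leverage_scores(C)
    p = np.outer(lb, lc).ravel()            # proposal over row index j*K + k
    p /= p.sum()
    idx = rng.choice(J * K, size=n_samples, p=p)
    w = 1.0 / np.sqrt(n_samples * p[idx])   # importance-sampling reweighting
    j, k = idx // K, idx % K
    M_s = (B[j] * C[k]) * w[:, None]        # sampled, reweighted rows of M
    T_s = T1[:, idx] * w                    # matching columns of the unfolding
    # solve min ||M_s A^T - T_s^T||_F for the factor A
    return np.linalg.lstsq(M_s, T_s.T, rcond=None)[0].T
```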


Extended HALS algorithm for nonnegative Tucker decomposition and its applications for multiway analysis and classification

Analysis of high dimensional data in modern applications, such as neuroscience, text mining, spectral analysis or chemometrics, naturally requires tensor decomposition methods. Tucker decompositions allow us to extract hidden factors (component matrices) with a different dimension in each mode and to investigate interactions among various modes. The Alternating Least Squares (ALS) algorithms h...



Journal title:

Volume   Issue

Pages  -

Publication date: 2011